Useful Facts about the Kullback-Leibler discrimination distance
Abstract
This report contains a list of some of the more prominent properties and theorems concerning the Kullback-Leibler (KL) discrimination distance. A brief discussion is also provided indicating the types of problems in which the KL distance has been applied. References are provided for the reader's convenience.

1 Definition

For two continuous density functions f_1(x) and f_2(x), the KL distance of f_1 relative to f_2 is defined as the expected value, with respect to f_1, of the negative log-likelihood ratio:

$$
\begin{aligned}
D(f_1 \,\|\, f_2) &= E_1[-\log L] && (1) \\
&= \int f_1(x) \log \frac{f_1(x)}{f_2(x)} \, dx && (2)
\end{aligned}
$$

Here, L denotes the likelihood ratio f_2(x)/f_1(x). To ensure the existence of the integral, we assume that the two densities are absolutely continuous with respect to one another (that is, they share the same support).

Let X be a discrete random variable defined on the discrete outcome space $\mathcal{X}$, and consider two probability mass functions p(x) and q(x), $x \in \mathcal{X}$. The KL distance of p relative to q is then defined as

$$
\begin{aligned}
D(p \,\|\, q) &= E_p[-\log L] && (3) \\
&= \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)}, && (4)
\end{aligned}
$$

where L again denotes the likelihood ratio q(x)/p(x).
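To make the definitions concrete, here is a minimal Python sketch (the helper names kl_discrete and kl_gaussians are ours, purely for illustration): it evaluates the sum in (4) directly, and uses the standard closed form of the integral in (2) for two univariate normal densities. It also illustrates that in general D(p ‖ q) ≠ D(q ‖ p).

```python
import numpy as np

def kl_discrete(p, q):
    """Discrete KL distance D(p || q) of Eq. (4), for pmfs p and q given as
    arrays over a common outcome space (the shared-support assumption)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def kl_gaussians(mu1, s1, mu2, s2):
    """Closed form of Eq. (2) when f1 = N(mu1, s1^2), f2 = N(mu2, s2^2):
    log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_discrete(p, q))                 # ≈ 0.5108
print(kl_discrete(q, p))                 # ≈ 0.3681, showing the asymmetry
print(kl_gaussians(0.0, 1.0, 1.0, 2.0))  # ≈ 0.4431
```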
Similar Resources
Using Kullback-Leibler distance for performance evaluation of search designs
This paper considers the search problem introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating a true density h(.) based upon a random sample X1, ..., Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger, and LINEX loss functions in a Dynamic Linear Model, using the real price of oil for 106 years of data from 1913 to 2018, concerning the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the Quadratic loss function, Kullback-Leibler, Hellinger, and LINEX, trying to address the ...
Interval Entropy and Informative Distance
The Shannon interval entropy function, a useful dynamic measure of uncertainty for two-sided truncated random variables, has been proposed in the reliability literature. In this paper, we show that the interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time based on the Kullback-...
Image Recognition Using Kullback-Leibler Information Discrimination
The problem of automatic image recognition based on the minimum information discrimination principle is formulated and solved. A comparison of color histograms in the Kullback-Leibler information metric is proposed, combined with a method of directed enumeration of alternatives, as opposed to complete enumeration of competing hypotheses. Results of an experimental study of the Kullback-Leibler discri...